Temporal Structure Classification of Natural Languages by a Recurrent Reinforcement Network

Authors

  • Peter Ford Dominey
  • Franck Ramus
Abstract

Human infants are sensitive at birth to the contrasting rhythms, or prosodic structures, of languages, which can serve to bootstrap the acquisition of grammatical structure. We present a novel recurrent network architecture that simulates this sensitivity to different temporal structures. The recurrent connections in the network are non-modifiable, while the forward connections from the recurrent network to the output layer are modified by a simple reinforcement rule. This avoids the complexity of recurrent credit assignment and provides a flexible system for exploring the effects of temporal structure. The network is exposed to human speech that has been processed to preserve only the temporal component of prosodic structure. The network is trained to categorize individual sentences by their rhythm class, and can then generalize this learning to new sentences. These results demonstrate (1) a recurrent sequence-learning architecture capable of learning and generalizing temporal structure, and (2) a neurophysiologically plausible mechanism by which human infants could extract the prosodic structure of natural language.
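The architecture the abstract describes — a recurrent network whose recurrent weights stay fixed while only the forward readout is trained by a scalar reinforcement signal — can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's implementation: the network sizes, the learning rate, the exact reinforcement rule, and the periodic-pulse "rhythms" standing in for processed speech are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_UNITS = 100    # recurrent units (illustrative size)
N_CLASSES = 2    # rhythm classes

# Fixed (non-modifiable) input and recurrent connections, as in the abstract.
W_in = rng.normal(0.0, 1.0, N_UNITS)
W_rec = rng.normal(0.0, 1.0, (N_UNITS, N_UNITS)) / np.sqrt(N_UNITS)
W_rec *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_rec)))  # fading memory
W_out = np.zeros((N_CLASSES, N_UNITS))                   # trained readout

def make_sequence(cls, length=40):
    """Toy stand-in for speech rhythm: class 0 pulses every 4 steps,
    class 1 every 6 steps."""
    period = 4 if cls == 0 else 6
    return np.array([1.0 if t % period == 0 else 0.0 for t in range(length)])

def run_reservoir(seq):
    """Drive the fixed recurrent network; its final state reflects the
    temporal structure of the input."""
    x = np.zeros(N_UNITS)
    for u in seq:
        x = np.tanh(W_rec @ x + W_in * u)
    return x

# Only the forward readout learns, gated by a scalar reward (+1 correct,
# -1 wrong), so no error is propagated back through the recurrent weights.
eta = 0.05
for _ in range(5000):
    cls = int(rng.integers(N_CLASSES))
    x = run_reservoir(make_sequence(cls))
    choice = int(np.argmax(W_out @ x))
    reward = 1.0 if choice == cls else -1.0
    if reward < 0:  # change weights only when the reward signals an error
        W_out[choice] -= eta * x
        W_out[cls] += eta * x

pred0 = int(np.argmax(W_out @ run_reservoir(make_sequence(0))))
pred1 = int(np.argmax(W_out @ run_reservoir(make_sequence(1))))
print(pred0, pred1)
```

Because credit assignment never crosses the recurrent connections, the rule stays simple and local, which is the property the abstract highlights as neurophysiologically plausible.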


Related Articles

Speech Emotion Recognition Using Scalogram Based Deep Structure

Speech Emotion Recognition (SER) is an important part of speech-based Human-Computer Interface (HCI) applications. Previous SER methods rely on the extraction of features and training an appropriate classifier. However, most of those features can be affected by emotionally irrelevant factors such as gender, speaking styles and environment. Here, an SER method has been proposed based on a concat...


Elman Backpropagation as Reinforcement for Simple Recurrent Networks

Simple recurrent networks (SRNs) in symbolic time-series prediction (e.g., language processing models) are frequently trained with gradient-descent-based learning algorithms, notably with variants of backpropagation (BP). A major drawback for the cognitive plausibility of BP is that it is a supervised scheme in which a teacher has to provide a fully specified target answer. Yet agents in natur...


Neural Network Processing of Natural Language: I. Sensitivity to Serial, Temporal and Abstract Structure of Language in the Infant

Well before their first birthday, babies can acquire knowledge of serial order relations, as well as knowledge of more abstract rule-based structural relations between neighboring speech sounds within 2 minutes of exposure. These early learners can likewise acquire knowledge of rhythmic or temporal structure of a new language within 5-10 minutes of exposure. All three of these types of knowledg...


A C-LSTM Neural Network for Text Classification

Neural network models have been demonstrated to be capable of achieving remarkable performance in sentence and document modeling. Convolutional neural network (CNN) and recurrent neural network (RNN) are two mainstream architectures for such modeling tasks, which adopt totally different ways of understanding natural languages. In this work, we combine the strengths of both architectures and pro...


Liquid State Machine Optimization

In this thesis several possibilities are investigated for improving the performance of Liquid State Machines. A Liquid State Machine is a relatively new machine learning system capable of coping with temporal dependencies, with which basic recurrent neural networks often have problems. One reason for this is that it takes a long time to train a Recurrent Neural Netwo...





Publication date: 2007